https://w.atwiki.jp/hellishlondon/pages/41.html
Patch 1.2 - Patch Notes (02/11/08)

*** PLEASE NOTE THAT THIS PATCH IS NOT ON TEST CENTER AT THIS TIME. ***
*** MORE INFORMATION ON ITS ETA IS FORTHCOMING. ***

Patch 1.2 Notes
February 11, 2008

Hello everyone!

This patch has some major skill changes, a few vital balance tweaks, some pesky bug fixes, and a highly-requested new feature.

First and foremost is the introduction of our in-game mail system. This is something that we know is of vital interest to our community, and is the foundation for our consignment / auction house system. Players can not only send and receive messages with anyone in the game, but they can also attach Palladium and items to their mail. This makes getting items from one character to another – even your own – much simpler and more convenient. And while you need to be in a Station to attach or receive items and Palladium, you can read your messages anywhere, anytime.

We’re rolling out the second of our major character class balance passes. In this patch, we’ve done a comprehensive balance pass on the Marksman. We focused on addressing a number of issues that have made certain Marksman builds vastly overpowered in comparison to the rest of the classes and the desired difficulty of the game. There are also numerous areas where we have increased the effectiveness of or altered skills to give Marksman players a wider variety of powerful builds and tactics. Chief amongst these changes is addressing the notorious skill/weapon combinations involving Ricochet, Reflecting Shot, Multishot, Rapid Fire, Heightened Senses, Beacon, Dead Eye, and Hollow Points, while using a high rate of fire weapon, such as an Arclight Rifle.
While we realize that making choices on skill, item, and mod combinations which complement each other is a fundamental design element most emphasized with the Marksman in particular, there should never be one combination of skills which completely outshines the rest.

We didn’t want to entirely remove the most overpowered Marksman skill combinations, however, and we also didn’t want to leave other Marksman skill issues unaddressed. This would have been largely unfair to the Marksman players. As a result, you’ll see a large number of changes to the skills - such as synergies to the Ballistics, Grenade, and Precision Strike skill groups - that should promote an increased variety of specialized builds and create a more diversified play experience overall. We’ve been doing a lot of balancing and tweaking over the past few weeks in Test Center with these changes, and we’re excited to roll them out to all of our players.

Finally, there are a host of other fixes and additions in this patch, including adjusting the difficulty of the Nightmare end-game and making the big Moloch battle more interesting and challenging.

Thanks again to everyone for their great feedback and continued support of making Hellgate London bigger and better. See you online!

The Hellgate London Team

-----

In-Game Mail

Characters can now send and receive mail from any other character, including characters within the same account.
Mail consists of text messages, and one item of any size and any amount of Palladium can be attached for delivery. Here are some of the features of the mail system:

Mail can be read anywhere in the game via the Mail panel. This can be accessed using the Z key or the Mail icon located underneath the chat panel.
Items and Palladium can only be attached to or removed from mail while the character is in a Station.
Mail can be sent to any player whose name is known, regardless of whether they are online or not.
You can only send mail to one character at a time or to your entire guild.
There is no cap on the number of messages that can be sent or received, but mail is only stored for a certain amount of time. The mail panel shows how long each message has left before it is automatically deleted. Different types of mail (such as those with attachments or unread mail) may stay in your inbox for longer or shorter periods of time.
While you can send mail to a character in a different mode (Hardcore, Elite, and Hardcore Elite), you cannot attach items or Palladium to cross-mode mail. This follows the same rules as trading items.
You can’t mail quest items or non-tradable items (such as Dye Kits or a Skill Retrainer).

General

Adjusted difficulty of the Nightmare endgame. Decreased the ramp of the damage penalty that players face against monsters higher than their level.
Fields from different sources can now stack. Please note that the graphics from new fields still override those of previous fields.
Players must now wait 5 seconds before they can re-enter a Hellrift.
Fixed a bug which caused weapons to disappear when swapping weapon sets in the middle of performing certain skills.
Fixed a bug which caused the /played command to display the incorrect amount of time played.
Vendor inventories in Nightmare difficulty should now be stocked with Nightmare-level items.
The Shock effect now deals damage immediately when it’s applied.
Several character and monster animations have been improved.
Fixed known Blueprint exploits.
Fixed a bug which sometimes caused files to require unnecessary patching.

Stonehenge

Essences or Caste General Heads gathered in Normal and Nightmare difficulties may now be used in either difficulty setting.
Moloch has been further increased in difficulty, as well as being made more resistant to Ignite and Ignite Damage. Be sure you’re well prepared before attempting to defeat him in Hardcore Elite Mode!
Players may no longer exploit the essence pedestals via trading.

Monsters

Winged Imps now lose their invulnerability earlier.
Fixed some cases where monsters would get stuck in the ground.

Quests

Fixed a bug which sometimes caused collection side-quest counters to count down under certain conditions.
Players can now drop the train part quest items for the “…All the Live Long Day” quest. Also, players may no longer pick up these items if they do not have the quest active or if they already have the item.

User Interface / Controls

Some UI panels have been improved.
Shift-activation should now work for skills granted by items.
The [Shift], [Ctrl], and [Alt] keys are now re-mappable, and may be set as key-modifiers.

Graphics (DX10 Only)

Various distortion effects have been fixed.
Fixed a Hellrift portal graphical issue in which the image would slide inappropriately.
Translucent models now render properly against the background.
Particles and translucent models are no longer out of focus with the depth-of-field effect.
Depth-of-field blur amount has been reduced and is now more subtle.

Skills

PLEASE READ the changes to skills carefully as many have changed.
Some skills now have new names to represent their different effects or make more sense within the context of the character class.

Hunter

Fixed a bug that sometimes caused grenades to not fire properly when the skill is repeated.

Tactical Stance
Now also gives a base Firing Accuracy bonus of 50.
Changed the Critical Chance bonus effect (3% at rank 1 and 1% per additional rank) to a damage bonus of 15% at rank 1 and 5% per additional rank.

Escape
Escape now provides 1 second of invulnerability when it is activated.
Escape should now prevent enemy players in PVP from seeing the user.
The cool-down for this skill now starts when the skill is used, not when it is cleared.
The cool-down for this skill has been increased to 30 seconds.
The duration for this skill has been decreased from 6 seconds to 3 seconds at rank 1, increasing by 1 second per additional rank.
The rank cap on this skill has been increased from 5 to 7.
Escape now clears all other speed boost effects while it is active, including Adrenaline Pills and Sprint.
Using any skill or item that boosts speed will clear Escape, including Sprint.

Precision Strikes
Precision Strikes now have direct synergies with all other Precision Strikes.
Precision Strikes no longer have skill prerequisites.

Napalm Strike
Increased base Damage by 5%.
Ranks in Napalm Strike now increase the Duration of all Precision Strikes by 50%.

Smackdown
Ranks in Smackdown now increase the Elemental Attack Strengths of all Precision Strikes by 25%.

Shock and Awe
Decreased base Damage by 7%.
Ranks in Shock and Awe now increase the Radius of all Precision Strikes by 15%.

Marksman

Rebounder Rounds (formerly Ricochet)
"The Marksman uses modified ammunition that has a chance to ricochet off enemies and walls."
This skill is now an Active skill. It must be activated to gain the effect and it can be placed in your "fire left", "fire right", or "fire both weapons" slots.
There is no power cost for using this skill.
This skill has a chance to proc its effect (based on the skill and increasing with the rank of the skill) for each individual firing of the skill. Since its effect is no longer dependent upon landing a critical hit, the effect of this skill will be seen much more commonly at all skill ranks and character levels.
This skill has indirect synergies with the other skills in the "Ballistics" line. Each rank in Rebounder Rounds has a chance (albeit smaller) to proc when either Ravager Rounds or Penetrator Rounds are used.
Example: The first rank of Rebounder Rounds gives a 10% chance for shots to ricochet while using Rebounder Rounds. Due to skill synergy, Ravager Rounds and Penetrator Rounds have a 4% chance for this effect to occur while using either of those skills.

Ravager Rounds (formerly Reflected Shot)
"The Marksman uses modified ammunition that has a chance to blow through and subsequently retarget additional enemies."
This skill is now an Active skill. It must be activated to gain the effect and it can be placed in your "fire left", "fire right", or "fire both weapons" slots.
There is no power cost for using this skill.
This skill has a chance to proc its effect (based on the skill and increasing with the rank of the skill) for each individual firing of the skill. Since its effect is no longer dependent upon landing a critical hit, the effect of this skill will be seen much more commonly at all skill ranks and character levels.
This skill has indirect synergies with the other skills in the "Ballistics" line. Each rank in Ravager Rounds has a chance (albeit smaller) to proc when either Rebounder Rounds or Penetrator Rounds are used.
Example: The first rank of Ravager Rounds gives a 5% chance for shots to blow through and re-target a new enemy while using Ravager Rounds.
Due to skill synergy, Rebounder Rounds and Penetrator Rounds have a 2% chance for this effect to occur while using either of those skills.

Penetrator Rounds (replaces Homing Shot)
"The Marksman uses modified ammunition that has a chance to completely ignore the target’s shields."
This skill is now an Active skill. It must be activated to gain the effect and it can be placed in your "fire left", "fire right", or "fire both weapons" slots.
There is no power cost for using this skill.
This skill has a chance to proc its effect (based on the skill and increasing with the rank of the skill) for each individual firing of the skill. Since its effect is no longer dependent upon landing a critical hit, the effect of this skill will be seen much more commonly at all skill ranks and character levels.
This skill has indirect synergies with the other skills in the "Ballistics" line. Each rank in Penetrator Rounds has a chance (albeit smaller) to proc when either Ravager Rounds or Rebounder Rounds are used.
Example: The first rank of Penetrator Rounds gives a 10% chance for shots to ignore shields while using Penetrator Rounds. Due to skill synergy, Ravager Rounds and Rebounder Rounds have a 4% chance for this effect to occur while using either of those skills.

Weapon Master (formerly Hollow Points)
Increased the Critical Damage bonus of this skill from 20% at rank 1 and 10% per additional rank, to a flat 30% bonus per rank.

Beacon
Decreased the power cost by 33%.

Elemental Beacon
Now also decreases the target’s Elemental Attack Strengths. Gives increasing Elemental Attack and Defense penalties per rank that scale with the character’s level.

Overshield
Increased the Shields bonus per additional rank from 33% to 100%.

Elemental Vision
Increased the Elemental Attack Strength bonuses from 15% per rank to 25% per rank.

Sniper
Is now available at level 5.
No longer forces the Marksman to crouch and stop moving when used.
The character now moves at 50% speed while the skill is active.
Weapon Accuracy, Range, and Missile Velocity bonuses are now fixed at 60%, 50%, and 25%, respectively.
Damage bonus has been decreased from 200% to 150%.
Rank progression now decreases its rate of use and rate of fire penalties by 5% per rank, beginning at a 55% penalty at rank 1 and ending at a 10% penalty at rank 10.

Master Sniper
Now provides a 1% Critical Chance bonus per rank, in addition to its previous 20% Critical Damage bonus.

Rapid Fire
Now clears Multishot when used.

Multishot
This skill has been moved under Weapon Master in the skill tree. It now requires two points in Weapon Master to unlock, thus requiring fewer total points to unlock than before.

Escape Artist
The Movement Speed bonus has been increased to 25% per rank.
The rank cap on this skill has been increased from 5 to 7.

Grenades
Grenades now have direct synergies with all other Grenades.

Explosive Grenade
Explosive Grenades now explode on impact with monsters (but not objects) and have a 2 second fuse.
Decreased base Damage by 23%.
Decreased base Ignite Attack Strength by 33%.
Ranks in Explosive Grenade now increase the Damage of all grenades by 10%.

Phase Grenade
Increased base Damage by 6%.
Ranks in Phase Grenade now increase the Elemental Attack Strengths of all grenades by 30%.

Toxic Grenade
Toxic Grenades now bounce off of objects (ignoring monsters) before exploding after 1.5 seconds.
Toxic Grenades now create a field that lasts for 4 seconds, dealing field damage.
Decreased base Poison Attack Strength by 50%.
Ranks in Toxic Grenade now increase the splash or field radius of all grenades by 10%.

Flashcracker Grenade
Increased base Damage by 12%.
Decreased base Stun Attack Strength by 50%.
Ranks in Flashcracker Grenade now grant and increase the Shield Penetration of all grenades by 10%.

Engineer

Engineer Drones now properly receive Armor bonuses from Armor affixes.
Fixed bugs that caused Engineer Drones to receive more than twice as much Armor as intended.
Engineers should no longer erroneously receive bonuses from items which grant the Overshield skill.
Summoned pets should no longer fail to spawn when summoned.

Templar

Guardian

Aura of Thorns
Aura of Thorns should now properly increase Thorns damage from items.
Aura of Thorns has been rebalanced to fit the standard Aura progression model.

Blademaster

Crosscutter
Crosscutter now requires a target in order to use the skill.

Cabalist

Fixed description for Brom’s Curse.

Summoner
Hand of Nostrum should now properly increase the number of healing beams used by the Witch Doctor.

Evoker
Hellfire can now be aimed as long as the skill trigger is held down, and fires once the trigger is released.

Patch Notes Disclaimer
While we make every effort to include all upcoming changes in our patch notes, please be aware that occasionally some changes are unintentionally omitted. As mentioned in the opening letter, this is the second sweep of class balancing (the first being Patch 1).

-- Scapes
https://w.atwiki.jp/hira-struts11/pages/14.html
struts1.1/action.ActionServlet source code: ActionServlet.java - jakarta-struts-1.1-src/src/share/org/apache/struts/action - Code Search. Listing: init()
https://w.atwiki.jp/usb_audio/pages/40.html
Source: Audio Device Document 1.0 (PDF)

USB Device Class Definition for Audio Devices, Release 1.0, March 18, 1998

Extension Unit is not available in this case because it is bypassed). Default behavior is assumed when set to off.

In the case of a single Input Pin, logical channels that enter the Extension Unit are passed unaltered for those channels that are also present in the output cluster. Logical channels not available in the output cluster are absorbed by the Extension Unit. Logical channels present in the output cluster but unavailable in the input cluster are muted. In case of multiple Input Pins, corresponding logical input channels are equally mixed together before being passed to the output. An index to a string descriptor is provided to further describe the Extension Unit. The following table outlines the Extension Unit descriptor.

Table 4-15 Extension Unit Descriptor
Offset | Field | Size | Value | Description
0 | bLength | 1 | Number | Size of this descriptor, in bytes: 13+p+n
1 | bDescriptorType | 1 | Constant | CS_INTERFACE descriptor type.
2 | bDescriptorSubtype | 1 | Constant | EXTENSION_UNIT descriptor subtype.
3 | bUnitID | 1 | Number | Constant uniquely identifying the Unit within the audio function. This value is used in all requests to address this Unit.
4 | wExtensionCode | 2 | Constant | Vendor-specific code identifying the Extension Unit.
6 | bNrInPins | 1 | Number | Number of Input Pins of this Unit: p
7 | baSourceID(1) | 1 | Number | ID of the Unit or Terminal to which the first Input Pin of this Extension Unit is connected.
… | … | … | … | …
7+(p-1) | baSourceID(p) | 1 | Number | ID of the Unit or Terminal to which the last Input Pin of this Extension Unit is connected.
7+p | bNrChannels | 1 | Number | Number of logical output channels in the audio channel cluster of the Extension Unit.
7+p+1 | wChannelConfig | 2 | Bitmap | Describes the spatial location of the logical channels in the audio channel cluster of the Extension Unit.
7+p+3 | iChannelNames | 1 | Index | Index of a string descriptor, describing the name of the first logical channel in the audio channel cluster of the Extension Unit.
11+p | bControlSize | 1 | Number | Size, in bytes, of the bmControls field: n
12+p | bmControls | n | Bitmap | A bit set to 1 indicates that the mentioned Control is supported: D0 = Enable Processing; D1..(n*8-1) = Reserved
12+p+n | iExtension | 1 | Index | Index of a string descriptor, describing this Extension Unit.

4.3.2.8 Associated Interface Descriptor

The Associated Interface descriptor provides a means to indicate a relationship between a Terminal or a Unit and an interface, external to the audio function. It directly follows the Entity descriptor to which it is related. The bInterfaceNr field contains the interface number of the associated interface. The remainder of the descriptor depends both on the Entity to which it is related and on the interface class of the target interface. At this moment, no specific layouts are defined by this specification. The following table outlines the Associated Interface descriptor.

Table 4-16 Associated Interfaces Descriptor
Offset | Field | Size | Value | Description
0 | bLength | 1 | Number | Size of this descriptor, in bytes: 4+x
1 | bDescriptorType | 1 | Constant | CS_INTERFACE descriptor type.
2 | bDescriptorSubtype | 1 | Constant | ASSOC_INTERFACE descriptor subtype.
3 | bInterfaceNr | 1 | Number | The interface number of the associated interface.
4 | Association-specific | x | Number | Association-specific extension to the open-ended descriptor.

4.4 AudioControl Endpoint Descriptors

The following sections describe all possible endpoint-related descriptors for the AudioControl interface.

4.4.1 AC Control Endpoint Descriptors

4.4.1.1 Standard AC Control Endpoint Descriptor

Because endpoint 0 is used as the AudioControl control endpoint, there is no dedicated standard control endpoint descriptor.
4.4.1.2 Class-Specific AC Control Endpoint Descriptor

There is no dedicated class-specific control endpoint descriptor.

4.4.2 AC Interrupt Endpoint Descriptors

4.4.2.1 Standard AC Interrupt Endpoint Descriptor

The interrupt endpoint descriptor is identical to the standard endpoint descriptor defined in Section 9.6.4, “Endpoint,” of the USB Specification and further expanded as defined in the Universal Serial Bus Class Specification. Its fields are set to reflect the interrupt type of the endpoint. This endpoint is optional. The following table outlines the standard AC Interrupt Endpoint descriptor.

Table 4-17 Standard AC Interrupt Endpoint Descriptor
Offset | Field | Size | Value | Description
0 | bLength | 1 | Number | Size of this descriptor, in bytes: 9
1 | bDescriptorType | 1 | Constant | ENDPOINT descriptor type.
2 | bEndpointAddress | 1 | Endpoint | The address of the endpoint on the USB device described by this descriptor. The address is encoded as follows: D7 = Direction (1 = IN endpoint); D6..4 = Reserved, reset to zero; D3..0 = The endpoint number, determined by the designer.
3 | bmAttributes | 1 | Bit Map | D3..2 = Synchronization type (00 = None); D1..0 = Transfer type (11 = Interrupt). All other bits are reserved.
4 | wMaxPacketSize | 2 | Number | Maximum packet size this endpoint is capable of sending or receiving when this configuration is selected. Used here to pass 2-byte status information. Set to 2 if not shared, set to the appropriate value if shared.
6 | bInterval | 1 | Number | Left to the designer’s discretion. A value of 10 ms or more seems sufficient.
7 | bRefresh | 1 | Number | Reset to 0.
8 | bSynchAddress | 1 | Endpoint | Reset to 0.

4.4.2.2 Class-Specific AC Interrupt Endpoint Descriptor

There is no class-specific AudioControl interrupt endpoint descriptor.

4.5 AudioStreaming Interface Descriptors

The AudioStreaming (AS) interface descriptors contain all relevant information to characterize the AudioStreaming interface in full.
4.5.1 Standard AS Interface Descriptor

The standard AS interface descriptor is identical to the standard interface descriptor defined in Section 9.6.3, “Interface,” of the USB Specification, except that some fields now have dedicated values.

Table 4-18 Standard AS Interface Descriptor
Offset | Field | Size | Value | Description
0 | bLength | 1 | Number | Size of this descriptor, in bytes: 9
1 | bDescriptorType | 1 | Constant | INTERFACE descriptor type.
2 | bInterfaceNumber | 1 | Number | Number of interface. A zero-based value identifying the index in the array of concurrent interfaces supported by this configuration.
3 | bAlternateSetting | 1 | Number | Value used to select an alternate setting for the interface identified in the prior field.
4 | bNumEndpoints | 1 | Number | Number of endpoints used by this interface (excluding endpoint 0).
5 | bInterfaceClass | 1 | Class | AUDIO. Audio Interface Class code (assigned by the USB). See Section A.1, “Audio Interface Class Code.”
6 | bInterfaceSubClass | 1 | Subclass | AUDIO_STREAMING. Audio Interface Subclass code. Assigned by this specification. See Section A.2, “Audio Interface Subclass Codes.”
7 | bInterfaceProtocol | 1 | Protocol | Not used. Must be set to 0.
8 | iInterface | 1 | Index | Index of a string descriptor that describes this interface.

4.5.2 Class-Specific AS Interface Descriptor

The bTerminalLink field contains the unique Terminal ID of the Input or Output Terminal to which this interface is connected. The bDelay field holds a value that is a measure for the delay that is introduced in the audio data stream due to internal processing of the signal within the audio function. The Host software can take this value into account when phase relations between audio streams, processed by different audio functions, are important. The wFormatTag field holds information about the Audio Data Format that should be used when communicating with this interface.
If the interface has a USB isochronous endpoint associated with it, the wFormatTag field describes the Audio Data Format that should be used when exchanging data with this endpoint. If the interface has no endpoint, the wFormatTag field describes the Audio Data Format that is used on the (external) connection this interface represents. This specification defines a number of standard Formats, ranging from Mono 8-bit PCM to MPEG-2 7.1 encoded audio streams. A complete list of supported Audio Data Formats is provided in a separate document, USB Audio Data Formats, that is considered part of this specification. Further specific information concerning the Audio Data Format for this interface is reported in a separate type-specific descriptor, see Section 4.5.3, “Class-Specific AS Format Type Descriptor.” This can optionally be supplemented by format-specific information through a format-specific descriptor, see Section 4.5.4, “Class-Specific AS Format-Specific Descriptor.”

Table 4-19 Class-Specific AS Interface Descriptor
Offset | Field | Size | Value | Description
0 | bLength | 1 | Number | Size of this descriptor, in bytes: 7
1 | bDescriptorType | 1 | Constant | CS_INTERFACE descriptor type.
2 | bDescriptorSubtype | 1 | Constant | AS_GENERAL descriptor subtype.
3 | bTerminalLink | 1 | Constant | The Terminal ID of the Terminal to which the endpoint of this interface is connected.
4 | bDelay | 1 | Number | Delay (d) introduced by the data path (see Section 3.4, “Inter Channel Synchronization”). Expressed in number of frames.
5 | wFormatTag | 2 | Number | The Audio Data Format that has to be used to communicate with this interface.

4.5.3 Class-Specific AS Format Type Descriptor

The wFormatTag field in the class-specific AS Interface Descriptor implicitly indicates which Format Type should be used to communicate with the connection (USB or external) this interface represents.
(Each Audio Data Format belongs to a certain Format Type as outlined in USB Audio Data Formats.) Each Format Type has a specific Format Type descriptor associated with it. This class-specific AS Format Type descriptor follows the class-specific AS interface descriptor and delivers format type-specific information to the Host. The details and layout of this descriptor for each of the supported Format Types is found in USB Audio Data Formats.

4.5.4 Class-Specific AS Format-Specific Descriptor

As stated earlier, the wFormatTag field in the class-specific AS Interface Descriptor not only describes to what Format Type the interface belongs. It also states exactly what Audio Data Format should be used to communicate with the connection (USB or external) this interface represents. Some Audio Data Formats need additional format-specific information conveyed to the Host. Therefore, the Format Type descriptor may be followed by a class-specific AS format-specific descriptor. The details and layout of this descriptor for the Audio Data Formats that need it, is outlined in USB Audio Data Formats.

4.6 AudioStreaming Endpoint Descriptors

The following sections describe all possible endpoint-related descriptors for the AudioStreaming interface.
https://w.atwiki.jp/javadsge/pages/8550.html
var x = new Array();
var y = new Array();
var p = new Array();
var koma = new Array();
var namex = new Array();
var dx = new Array();
var dy = new Array();
var number_koma = new Array();

function myFunction() {
  // Piece positions (x, y) and owners (p)
  x[1] = 2; y[1] = 1; p[1] = 1;
  x[2] = 2; y[2] = 3; p[2] = 2;
  koma[1] = 1;
  koma[2] = 1;
  namex[1] = "王";
  id = "1aPGk98JxsexKKOm_Ypl_EX3RA-Qs-599VuD-7c-jvvg";
  var ex = SpreadsheetApp.openById(id);
  var sh = ex.getSheetByName("data");
  // Write each piece's name into its cell on the board
  for (s = 1; s < 3; s++) {
    sx = koma[s];
    sh.getRange(y[s], x[s]).setValue(namex[sx]);
  }
  for (s = 1; s < 2; s++) {
    dx[s] = new Array();
    dy[s] = new Array();
  }
  // Move offsets for piece type 1 (king): 8 directions
  number_koma[1] = 8;
  dx[1][1] = 1;  dy[1][1] = 0;
  dx[1][2] = 0;  dy[1][2] = 1;
  dx[1][3] = 0;  dy[1][3] = -1;
  dx[1][4] = -1; dy[1][4] = 0;
  dx[1][5] = 1;  dy[1][5] = -1;
  dx[1][6] = -1; dy[1][6] = 1;
  dx[1][7] = 1;  dy[1][7] = 1;
  dx[1][8] = -1; dy[1][8] = -1;
  select(2);
}

function select(sp) {
  k1 = koma[sp];
  var s, sx;
  var ax = new Array();
  var ay = new Array();
  sx = 0;
  // Collect the destinations that stay on the 3x3 board
  for (s = 1; s < number_koma[k1] + 1; s++) {
    x2 = x[sp] + dx[k1][s];
    y2 = y[sp] + dy[k1][s];
    h = 0;
    if (x2 > 3) h = 100;
    if (x2 < 1) h = 100;
    if (y2 > 3) h = 100;
    if (y2 < 1) h = 100;
    if (h < 50) sx = sx + 1;
    if (h < 50) ax[sx] = x2;
    if (h < 50) ay[sx] = y2;
  }
  number_sub = sx;
  Logger.log(number_sub);
}
https://w.atwiki.jp/atachi/pages/56.html
List of COM port control methods
Controlling COM ports from C#
Binary data input/output
Asynchronous communication
String encoding
Send/receive timing

Serial ports are supported in .NET Framework 2.0 and later. SerialPort belongs to System.IO.Ports. In a Windows Forms project, the SerialPort control can be used as a component. WPF has no such control, so create an instance of the System.IO.Ports.SerialPort class in code.

List of COM port control methods

Enumerating ports:
string[] ports = SerialPort.GetPortNames();

Controlling COM ports from C#

When the port I/O is US-ASCII, WriteLine/ReadLine can be used.

// [Opening the port]
// Assumes a COM port named "COM1" exists on the system
SerialPort port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
port.Open();
port.DtrEnable = true;
port.RtsEnable = true;
// [Writing to the port]
port.WriteLine("Hello");
// WriteLine automatically appends a newline (*1)
// to the end of the specified string
// [Reading from the port]
string read = port.ReadLine();
// ReadLine reads a string from the port.
// When a newline is detected in the incoming data,
// the string is returned with the newline (*1) removed.
// [Closing the port]
port.Close();
// *1: the newline is "CRLF = 0x0d 0x0a" (2 bytes)

Incidentally, the SerialPort class implements the IDisposable interface, so following C# conventions it is safer to write:

using (SerialPort port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One))
{
    port.Open();
} // no port.Close() needed

Binary data input/output

Binary data can be read and written with SerialPort.Read and SerialPort.Write.

Asynchronous communication

Using events, a listener handler can be invoked when data arrives at the port.

// port is a SerialPort object
port.DataReceived += new SerialDataReceivedEventHandler(OnSerialDataReceived);

void OnSerialDataReceived(object sender, SerialDataReceivedEventArgs e)
{
    // handle received data here
    SerialPort s = sender as SerialPort;
}

String encoding

Japanese text can also be received from a COM port. C# handles text as the string type (UTF-16 internally), so bytes arriving from the port must be decoded. By specifying the encoding used by the device, SerialPort.ReadLine() performs the appropriate decoding and returns a string.

// port is a SerialPort object
port.Encoding = Encoding.GetEncoding("Shift_JIS");
// Specifies that strings obtained with ReadLine are decoded as Shift_JIS

Send/receive timing

As long as the SerialPort methods are used, data is sent or received immediately after the method call. However, there are cases where sending or receiving fails: for example, when the COM port is controlled with handshaking and the device is detected as not ready, the transfer cannot be performed immediately. In such cases, SerialPort.Read and SerialPort.Write throw an exception.
https://w.atwiki.jp/usb_audio/pages/32.html
Source: Audio Device Document 1.0 (PDF)

USB Device Class Definition for Audio Devices, Release 1.0, March 18, 1998

· Resolution attribute

As an example, consider a Volume Control inside a Feature Unit. By issuing the appropriate Get requests, the Host software can obtain values for the Volume Control’s attributes and, for instance, use them to correctly display the Control on the screen. Setting the Volume Control’s current attribute allows the Host software to change the volume setting of the Volume Control. Additionally, each Entity (Unit or Terminal) in an audio function can have a memory space attribute. This attribute optionally provides generic access to the internal memory space of the Entity. This could be used to implement vendor-specific control of an Entity through generically provided access.

3.5.1 Input Terminal

The Input Terminal (IT) is used to interface between the audio function’s ‘outside world’ and other Units in the audio function. It serves as a receptacle for audio information flowing into the audio function. Its function is to represent a source of incoming audio data after this data has been properly extracted from the original audio stream into the separate logical channels that are embedded in this stream (the decoding process). The logical channels are grouped into an audio channel cluster and leave the Input Terminal through a single Output Pin.

An Input Terminal can represent inputs to the audio function other than USB OUT endpoints. A Line-In connector on an audio device is an example of such a non-USB input. However, if the audio stream is entering the audio function by means of a USB OUT endpoint, there is a one-to-one relationship between that endpoint and its associated Input Terminal. The class-specific endpoint descriptor contains a field that holds a direct reference to this Input Terminal.
The Host needs to use both the endpoint descriptors and the Input Terminal descriptor to get a full understanding of the characteristics and capabilities of the Input Terminal. Stream-related parameters are stored in the endpoint descriptors. Control-related parameters are stored in the Terminal descriptor.

The conversion process from incoming, possibly encoded audio streams to logical audio channels always involves some kind of decoding engine. This specification defines several types of decoding. These decoding types range from rather trivial decoding schemes like converting interleaved stereo 16 bit PCM data into a Left and Right logical channel to very sophisticated schemes like converting an MPEG-2 7.1 encoded audio stream into Left, Left Center, Center, Right Center, Right, Right Surround, Left Surround and Low Frequency Enhancement logical channels. The decoding engine is considered part of the Entity that actually receives the encoded audio data streams (like a USB AudioStreaming interface). The type of decoding is therefore implied in the wFormatTag value, located in the AudioStreaming interface descriptor. Requests specific to the decoding engine must be directed to the AudioStreaming interface. The associated Input Terminal deals with the logical channels after they have been decoded.

The symbol for the Input Terminal is depicted in the following figure:

[image] Figure 3-1: Input Terminal Icon

3.5.2 Output Terminal

The Output Terminal (OT) is used to interface between Units inside the audio function and the ‘outside world’. It serves as an outlet for audio information, flowing out of the audio function. Its function is to represent a sink of outgoing audio data before this data is properly packed from the original separate logical channels into the outgoing audio stream (the encoding process). The audio channel cluster enters the Output Terminal through a single Input Pin.
An Output Terminal can represent outputs from the audio function other than USB IN endpoints. A speaker built into an audio device or a Line Out connector is an example of such a non-USB output. However, if the audio stream is leaving the audio function by means of a USB IN endpoint, there is a one-to-one relationship between that endpoint and its associated Output Terminal. The class-specific endpoint descriptor contains a field that holds a direct reference to this Output Terminal.

The Host needs to use both the endpoint descriptors and the Output Terminal descriptor to fully understand the characteristics and capabilities of the Output Terminal. Stream-related parameters are stored in the endpoint descriptors. Control-related parameters are stored in the Terminal descriptor.

The conversion process from incoming logical audio channels to possibly encoded audio streams always involves some kind of encoding engine. This specification defines several types of encoding, ranging from rather trivial to very sophisticated schemes. The encoding engine is considered part of the Entity that actually transmits the encoded audio data streams (like a USB AudioStreaming interface). The type of encoding is therefore implied in the wFormatTag value, located in the AudioStreaming interface descriptor. Requests specific to the encoding engine must be directed to the AudioStreaming interface. The associated Output Terminal deals with the logical channels before encoding.

The symbol for the Output Terminal is depicted in the following figure:

[image] Figure 3-2: Output Terminal Icon

3.5.3 Mixer Unit

The Mixer Unit (MU) transforms a number of logical input channels into a number of logical output channels. The input channels are grouped into one or more audio channel clusters. Each cluster enters the Mixer Unit through an Input Pin.
The logical output channels are grouped into one audio channel cluster and leave the Mixer Unit through a single Output Pin. Every input channel can virtually be mixed into all of the output channels. If n is the total number of input channels and m is the number of output channels, then there are n × m mixing Controls in the Mixer Unit. Not all of these Controls have to be physically implemented. Some Controls can have a fixed setting and be non-programmable. The Mixer Unit Descriptor reports which Controls are programmable in the bmControls bitmap field. Using this model, a permanent connection can be implemented by reporting the Control as non-programmable and by returning a Control setting of 0 dB when requested. Likewise, a missing connection can be implemented by reporting the Control as non-programmable and by returning a Control setting of -∞ dB.

The symbol for the Mixer Unit can be found in the following figure:

[image] Figure 3-3: Mixer Unit Icon

3.5.4 Selector Unit

The Selector Unit (SU) selects from n audio channel clusters, each containing m logical input channels, and routes them unaltered to the single output audio channel cluster, containing m output channels. It represents a multi-channel source selector, capable of selecting between n m-channel sources. It has n Input Pins and a single Output Pin.

The symbol for the Selector Unit can be found in the following figure:

[image] Figure 3-4: Selector Unit Icon

3.5.5 Feature Unit

The Feature Unit (FU) is essentially a multi-channel processing unit that provides basic manipulation of the incoming logical channels.
For each logical channel, the Feature Unit optionally provides audio Controls for the following features:

· Volume
· Mute
· Tone Control (Bass, Mid, Treble)
· Graphic Equalizer
· Automatic Gain Control
· Delay
· Bass Boost
· Loudness

In addition, the Feature Unit optionally provides the above audio Controls but now influencing all channels of the cluster at once. In this way, ‘master’ Controls can be implemented. The master Controls are cascaded after the individual channel Controls. This setup is especially useful in multi-channel systems where the individual channel Controls can be used for channel balancing and the master Controls can be used for overall settings. The logical channels in the cluster are numbered from one to the total number of channels in the cluster. The ‘master’ channel has channel number zero and is always virtually present. The Feature Unit Descriptor reports which Controls are present for every channel in the Feature Unit and for the ‘master’ channel.

All logical channels in a Feature Unit are fully independent. There exist no cross couplings among channels within the Feature Unit. There are as many logical output channels as there are input channels. These are grouped into one audio channel cluster that enters the Feature Unit through a single Input Pin and leaves the Unit through a single Output Pin.

The symbol for the Feature Unit is depicted in the following figure:

[image] Figure 3-5: Feature Unit Icon

3.5.6 Processing Unit

The Processing Unit (PU) represents a functional block inside the audio function that transforms a number of logical input channels, grouped into one or more audio channel clusters, into a number of logical output channels, grouped into one audio channel cluster. Therefore, the Processing Unit can have multiple Input Pins and has a single Output Pin.
This specification defines several standard transforms (algorithms) that are considered necessary to support additional audio functionality; these transforms are not covered by the other Unit types but are commonplace enough to be included in this specification so that a generic driver can provide control for it. Processing Units are encouraged to support at least the Enable Processing Control, allowing the Host software to bypass whatever functionality is incorporated in the Processing Unit.

3.5.6.1 Up/Down-mix Processing Unit

The Up/Down-mix Processing Unit provides facilities to derive m output audio channels from n input audio channels. The algorithms and transforms applied to accomplish this are not defined by this specification and can be proprietary. The input channels are grouped into one input channel cluster that enters the Processing Unit over a single Input Pin. Likewise, all output channels are grouped into one output channel cluster, leaving the Processing Unit over a single Output Pin.

The Up/Down-mix Processing Unit can support multiple modes of operation (besides the bypass mode, controlled by the Enable Processing Control). The available input audio channels are dictated by the Unit or Terminal to which the Up/Down-mix Processing Unit is connected. The Up/Down-mix Processing Unit descriptor reports which up/down-mixing modes the Unit supports through its waModes() array. Each element of the waModes() array indicates which output channels in the output cluster are effectively used in a particular mode. The unused output channels in the output cluster must produce muted output. Mode selection is implemented using the Get/Set Control request.

As an example, consider the case where an Up/Down-mix Processing Unit is connected to an Input Terminal, producing Dolby™ AC-3 5.1 decoded audio. The input audio channel cluster to the Up/Down-mix Processing Unit therefore contains Left, Right, Center, Left Surround, Right Surround and LFE logical channels.
Suppose the audio function’s hardware is limited to reproducing only dual channel audio. Then the Up/Down-mix Processing Unit could use some (sophisticated) algorithms to down-mix the available spatial audio information into two (‘enriched’) channels so that the maximum spatial effects can be experienced, using only two channels. It is left to the audio function’s discretion to use the appropriate down-mix algorithm depending on the physical nature of the Output Terminal to which the Up/Down-mix Processing Unit is routed. For instance, a different down-mix algorithm is needed depending on whether the ‘enriched’ stereo stream is sent to a pair of speakers or to a headphone set. However, this knowledge already resides within the audio function and deciding which down-mix algorithm to use does not need Host intervention.

As a second interesting example, suppose the hardware is capable of servicing eight discrete audio channels, for instance a full-fledged MPEG-2 7.1 system. Now the Up/Down-mix Processing Unit could use certain techniques to derive meaningful content for the extra audio channels (Left of Center, Right of Center) that are present in the output cluster and are missing in the input channel cluster (AC-3 5.1). This is a typical example of an up-mix situation.

The symbol for the Up/Down-mix Processing Unit is depicted in the following figure:

[image] Figure 3-6: Up/Down-mix Processing Unit Icon

3.5.6.2 Dolby Prologic Processing Unit

The Dolby Prologic™ decoding process can be seen as an operator on the Left and Right logical channels of the input cluster of the Unit. It is capable of extracting additional audio data (Center and/or Surround channels) from information that is transparently ‘superimposed’ on the Left and Right audio channels. It therefore differs from a true decoding process as defined for an Input Terminal. It can be applied on a logical audio stream anywhere in the audio function.
The Dolby Prologic Processing Unit is a specialized derivative of the Up/Down-mix Processing Unit. The Dolby Prologic Processing Unit can have the following modes of operation (besides the bypass mode, controlled by the Enable Processing Control):

· Left, Right, Center channel decoding
· Left, Right, Surround channel decoding
· Left, Right, Center, Surround decoding

The Dolby Prologic Processing Unit descriptor reports which modes the Unit supports. Mode selection is then implemented using the Get/Set Control request. Dolby Prologic Surround Delay Control is considered not to be part of the Dolby Prologic™ Processing Unit and must be handled by a separate Feature Unit. Dolby Prologic Bass Management is the local responsibility of the audio function and should not be controllable from the Host.

The symbol for the Dolby Prologic Processing Unit can be found in the following picture:

[image] Figure 3-7: Dolby Prologic Processing Unit Icon

3.5.6.3 3D-Stereo Extender Processing Unit

The 3D-Stereo Extender Processing Unit operates on Left and Right channels only. It processes an existing stereo (two channel) soundtrack to add spaciousness and to make it appear to originate from outside the Left/Right speaker locations. Extended stereo effects can be achieved via various, straightforward methods. The algorithms and transforms applied to accomplish this are not defined by this specification and can be proprietary. The effects of the 3D-Stereo Extender Processing Unit can be bypassed at all times through manipulation of the Enable Processing Control. The size of the listening area (the area in which the listener has to be placed with respect to the speakers to hear the effect, also called the sweet spot) can be controlled using the proper Get/Set Control request.
The symbol for the 3D-Stereo Extender Unit is depicted in the following figure:

[image] Figure 3-8: 3D-Stereo Extender Processing Unit Icon

3.5.6.4 Reverberation Processing Unit

The Reverberation Processing Unit is used to add room acoustics effects to the original audio information. These effects can range from small room reverberation effects to simulation of a large concert hall reverberation. A number of parameters can be manipulated to obtain the desired reverberation effects.

· Reverb Type: Room1, Room2, Room3, Hall1, Hall2, Plate, Delay, and Panning Delay.
https://w.atwiki.jp/bgmfes/pages/40.html
Related parties
B.G.M Festival(美郷あきオフィシャルブログ)
B.G.M Festival Vol.0(yozucamera*)
BGMフェス Vol.0(ゆりあぶろぐ)
Burst dream ! 後ろに夢なんかない(えんちゃんねる)
B.G.M(AiRIのブログ)
B.G.M. fes Vol.0ありがとう!!!(モモブロ)
B.G.M festival/続・B.G.M FESTIVAL/続々B.G.M Festival(メジャー・オア・アンダーグラウンド)

Press
桃井はるこ、佐藤ひろ美らが熱唱!『B.G.Mフェスティバル Vol.0』レポート!(じゃぽかる)

Attendees
「B.G.M Festival Vol.0」レポ #bgm_fes(Dolphan日記帳)
B.G.M Festival Vol.0@品川ステラボール(無限迷路)
B.G.M Festival Vol.0(おもてのガラクタ日記)
B.G.M Festival Vol.0(ひとりこすりんぴっく)
「B.G.M Festival Vol.0」感想(あっち向いてこっち向いて)
B.G.M Festival vol.0(Tightrope Dancer)
B.G.M Festival Vol.0(みづなきそら)
B.G.M Festival Vol.0(アニソンとパソゲーとその他モロモロ)
B.G.M Festival Vol.0 感想(かねぴ~の自堕落日記)
「B.G.M Festival Vol.0」ライブレポート。美少女ゲーム好きとして初心に戻れました!(MA-SAブログ)
B.G.M Festival Vol.0(青空の下で)
B.G.M Festival Vol.0とかエウくじとかサンクリとか(みちょー牧場)
B.G.M Festival Vol.0(The hymn to dream)
B.G.M Festival Vol.0 無事終了おめでとう!(Azure)
B.G.M Festivalお疲れ様でしたー(二次嫁に賛辞を)
B.G.M Festival Vol.0 レポ(徹夜会会長の脳内World)
B.G.M.Festival vol.0に参加してきました(amamistの日記)
B.G.M Festival Vol.0 ~そして伝説へ~(萃-sui-)
BGMフェス(ハニワ亭)
B.G.M.フェスVol.0 ニコ生 でみたよ(どどどの日誌)
B.G.M Festival Vol.0(2次元とコーラと俺)
https://w.atwiki.jp/touhoukashi/pages/3166.html
【Registered tags: C Liz Triangle lily-an ポーカーフェイサー 妖魔夜行 少女綺想曲 ~ Dream Battle 曲】
https://w.atwiki.jp/visualstudio/pages/36.html
C# 3.0 added LINQ. Along with it, C# 3.0 also added a number of language features that improve the readability of LINQ code.

Contents
Implicitly typed local variables
Implicitly typed arrays
Object initializers
Collection initializers
Auto-implemented properties
Anonymous types
Extension methods
Lambda expressions
LINQ

Implicitly typed local variables
Local variables can now be declared without explicitly specifying a type
Use the var keyword
The appropriate type is assigned at compile time
Cannot be used for fields
Unlike VB6's Variant, the variable must be initialized at declaration
Unlike VB6's Variant, no type conversion is possible
Used to avoid verbose type names, or when using anonymous types

class Program
{
    // cannot be used for fields
    //var a = 123;

    static void Main(string[] args)
    {
        var i = 123;
        var d = 12.3M;
        var str = "abc";
        var dt = DateTime.Now;

        // must be initialized at declaration
        //var b;

        // no type conversion is possible
        //i = "abc";

        // avoiding a verbose type name
        var obj1 = new Myclass();

        // using an anonymous type
        var obj2 = new { Name = "abc" };
    }
}
class Myclass { }

Implicitly typed arrays
When creating an array with new, the type after new can be omitted
The appropriate type is inferred at compile time from the contents of { }
If multiple types are mixed, a compile error results

class Program
{
    static void Main(string[] args)
    {
        var i = new[] { 1, 2, 3, };
        var d = new[] { 1.1, 2.2, 3.3 };
        var str = new[] { "abc", "def", "ghi" };

        // a mix of types is a compile error
        //var obj = new[] { 1, "abc", 2 };

        // with anonymous types
        var persons = new[]
        {
            new { Name = "Taro", Age = 20 },
            new { Name = "Jiro", Age = 18 }
        };
    }
}

Object initializers
Objects can now be initialized concisely
Initialization is written inside braces { }

class Program
{
    static void Main(string[] args)
    {
        // C# 3.0
        var person3 = new Person { Age = 20, Name = "Taro" };

        // C# 2.0
        Person person2 = new Person();
        person2.Age = 20;
        person2.Name = "Taro";
    }
}
class Person
{
    public string m_name;
    public int m_age;
    public string Name { get { return m_name; } set { m_name = value; } }
    public int Age { get { return m_age; } set { m_age = value; } }
}

Collection initializers
Collections can now be initialized concisely
Initialization is written inside braces { }

class Program
{
    static void Main(string[] args)
    {
        // C# 3.0
        var list3 = new List<int> { 1, 2, 3 };
        var dic3 = new Dictionary<string, int>() { { "a", 1 }, { "b", 2 }, { "c", 3 } };

        // C# 2.0
        List<int> list2 = new List<int>();
        list2.Add(1);
        list2.Add(2);
        list2.Add(3);
        Dictionary<string, int> dic2 = new Dictionary<string, int>();
        dic2.Add("a", 1);
        dic2.Add("b", 2);
        dic2.Add("c", 3);
    }
}

Auto-implemented properties
Properties can now be written concisely
At compile time, a conventional property is generated automatically
Both get and set must be written
To create a read-only property, make the set accessor private

class Person
{
    public string Name { get; set; }
}

Compiling the class above and inspecting the output with Reflector for .NET shows that the following field and property are generated automatically:

internal class Person
{
    // Fields
    private string <Name>k__BackingField;

    // Properties
    public string Name
    {
        get { return this.<Name>k__BackingField; }
        set { this.<Name>k__BackingField = value; }
    }
}

Anonymous types
Objects can be created without defining a class separately
Because the class name of an anonymous type is unknown to the programmer, the var keyword is used
The class is generated automatically at compile time
Anonymous types with the same property names, types, and order share a single generated class
Mainly used in the select clause of LINQ

class Program
{
    static void Main(string[] args)
    {
        var obj1 = new { Name = "Taro", Age = 20 };
        var obj2 = new { Name = "Jiro", Age = 18 };
        if (obj1.GetType() == obj2.GetType())
        {
            Console.WriteLine("same type");
        }
    }
}

Compiling the class above and inspecting the output with Reflector for .NET shows that the following class is generated automatically:

internal sealed class <>f__AnonymousType0<<Name>j__TPar, <Age>j__TPar>
{
    // Fields
    private readonly <Age>j__TPar <Age>i__Field;
    private readonly <Name>j__TPar <Name>i__Field;

    // Methods
    public <>f__AnonymousType0(<Name>j__TPar Name, <Age>j__TPar Age)
    {
        this.<Name>i__Field = Name;
        this.<Age>i__Field = Age;
    }

    // Properties
    public <Age>j__TPar Age { get { return this.<Age>i__Field; } }
    public <Name>j__TPar Name { get { return this.<Name>i__Field; } }
}

Extension methods
Add instance methods to an existing class without inheriting from it
Must be declared as a static method in a non-generic static class
The this modifier specifies the type being extended
Enabled by importing the namespace containing the extension method with a using directive

// importing the namespace containing the extension method enables it
using B;

namespace A
{
    class Program
    {
        static void Main(string[] args)
        {
            // calling the extension method
            "abc".Print();

            // it can also be called as an ordinary static method
            StringExtensions.Print("abc");
        }
    }
}
namespace B
{
    static class StringExtensions
    {
        // extension method
        public static void Print(this string str)
        {
            System.Console.WriteLine(str);
        }
    }
}

When an extension method and an instance method conflict, the instance method takes precedence

namespace A
{
    class Program
    {
        static void Main(string[] args)
        {
            X obj = new X();
            // X.Method() is called
            obj.Method();
        }
    }
    class X
    {
        public void Method() { }
    }
    static class Y
    {
        public static void Method(this X x) { }
    }
}

Lambda expressions
A lambda expression is an anonymous function that can contain expressions and statements, and is used to create delegate types or expression tree types
Uses the => operator
Input parameters go on the left of =>; the expression or statement body goes on the right

Creating a delegate type:

namespace LambdaExpression
{
    delegate int D(int x, int y);

    class Program
    {
        static void Main(string[] args)
        {
            // C# 1.0
            // an external method had to be provided
            D d1 = new D(Add);

            // C# 2.0
            // anonymous methods allowed the body to be written directly in { }
            D d2 = delegate(int x, int y) { return x + y; };

            // C# 3.0
            // lambda expressions are more concise than anonymous methods
            D d3 = (x, y) => x + y;
            // the following forms are also valid:
            //D d3 = (int x, int y) => { return x + y; };
            //D d3 = (x, y) => { return x + y; };
            //D d3 = (int x, int y) => x + y;

            Console.WriteLine(d1(1, 2));
            Console.WriteLine(d2(1, 2));
            Console.WriteLine(d3(1, 2));
        }
        static int Add(int x, int y) { return x + y; }
    }
}

Creating an expression tree type:

using System.Linq.Expressions;

namespace LambdaExpression
{
    class Program
    {
        static void Main(string[] args)
        {
            // assigning a lambda expression to a delegate produces an anonymous method
            Func<int, int, int> f = (x, y) => x + y;
            Console.WriteLine(f(1, 2));

            // assigning a lambda expression to an Expression produces an expression tree
            Expression<Func<int, int, int>> e = (x, y) => x + y;
            var bin = (BinaryExpression)e.Body;
            var p1 = (ParameterExpression)bin.Left;
            var p2 = (ParameterExpression)bin.Right;
            Console.WriteLine(bin);
            Console.WriteLine(p1);
            Console.WriteLine(p2);
        }
    }
}

LINQ
LINQ stands for Language Integrated Query
LINQ provides unified query syntax for accessing different kinds of data
LINQ queries can be written directly in C# and VB.NET code
LINQ queries get compile-time type checking and IntelliSense support
The main data sources are: objects (LINQ to Objects), XML (LINQ to XML), ADO.NET DataSets (LINQ to DataSet), SQL Server databases (LINQ to SQL), and conceptual entities provided by the ADO.NET Entity Framework (LINQ to Entities)

An example of LINQ to Objects in C# 3.0:

class Program
{
    static void Main(string[] args)
    {
        var persons = new[]
        {
            new { Name = "Taro", Age = 20 },
            new { Name = "Jiro", Age = 18 }
        };
        var adults = from person in persons
                     where person.Age >= 20
                     select new { person.Name, person.Age };
        foreach (var adult in adults)
        {
            Console.WriteLine(adult.Name + "," + adult.Age);
        }
    }
}

Equivalent code written in C# 2.0:

class Program
{
    static void Main(string[] args)
    {
        List<Person> persons = new List<Person>();
        persons.Add(new Person("Taro", 20));
        persons.Add(new Person("Jiro", 18));
        List<Person> adults = new List<Person>();
        foreach (Person person in persons)
        {
            if (person.Age >= 20)
            {
                Person adult = new Person(person.Name, person.Age);
                adults.Add(adult);
            }
        }
        foreach (Person adult in adults)
        {
            Console.WriteLine(adult.Name + "," + adult.Age);
        }
    }
}
class Person
{
    public string m_name;
    public int m_age;
    public Person(string name, int age) { m_name = name; m_age = age; }
    public string Name { get { return m_name; } set { m_name = value; } }
    public int Age { get { return m_age; } set { m_age = value; } }
}

Reference
C# 3.0 の概要 (Overview of C# 3.0) http://www.microsoft.com/japan/msdn/net/bb308966.aspx
https://w.atwiki.jp/wnt0/pages/29.html
#include <iostream>
using namespace std;

class Product
{
public:
    void Operation()
    {
        cout << "Product." << endl;
    }
};

class FactoryImple
{
public:
    virtual Product* FactoryMethod()
    {
        return new Product;
    }
};

class Factory
{
public:
    Factory(FactoryImple *factory)
    {
        m_factory = factory;
        m_prod = NULL;
    }
    ~Factory()
    {
        if (m_prod != NULL)
        {
            delete m_prod;
            cout << "Product deleted." << endl;
        }
    }
    virtual Product* CreateProduct()
    {
        Product *prod = m_factory->FactoryMethod();
        m_prod = prod; // needs handling for the case where this is called more than once
        return prod;
    }
private:
    FactoryImple *m_factory;
    Product *m_prod;
};

int main()
{
    FactoryImple *fac_imple = new FactoryImple;
    Factory *factory = new Factory(fac_imple);
    Product *prod = factory->CreateProduct();
    prod->Operation();
    delete fac_imple;
    delete factory; // prod is also deleted here
    return 0;
}

Reference sites
デザインパターンを“喩え話”で分かり易く理解する http://www.netlaputa.ne.jp/~hijk/study/oo/designpattern.html
TECHSCORE http://www.techscore.com/tech/DesignPattern/index.html/
Programing Place http://www.geocities.jp/ky_webid/index_old.html
デザインパターンの骸骨たち http://www002.upp.so-net.ne.jp/ys_oota/mdp/